What you will do
- Gather, analyze, and interpret data to fulfill reporting requests from business stakeholders
- Translate complex data into clear, actionable insights that support business strategy
- Support ad hoc analysis and contribute to data-driven decision-making
- Automate routine reporting processes and maintain existing reports to ensure reliability and improve data workflows
- Maintain, design, and implement advanced dashboards using tools like Google Looker Studio, enabling self-service analytics across the organization
- Collaborate with data engineers, data scientists, and stakeholders across the organization to ensure data quality and consistency while delivering data-driven insights that support supply-related business decisions
- Communicate findings effectively to technical and non-technical audiences
- Foster a data-driven culture within the organization, promoting the use of analytics in decision-making processes

Who you are
- At least 1–2 years of experience as a Data Analyst
- Proficiency in SQL for complex data analysis, reporting, and querying large datasets
- Experience with Google BigQuery
- Experience with data visualization tools (e.g., Looker Studio, Excel, or similar) to create compelling dashboards and reports
- Excellent communication skills in English
- Ability to translate fuzzy business requirements from diverse stakeholders into analytical requirements
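To give a concrete flavor of the SQL work this role describes, here is a minimal, hypothetical sketch of a stakeholder reporting query. It uses Python's built-in sqlite3 as a stand-in for BigQuery, and the orders table, its columns, and the figures are all invented for illustration:

```python
import sqlite3

# Hypothetical reporting query of the kind described above:
# aggregate KPIs per region for a stakeholder dashboard.
# sqlite3 stands in for BigQuery; schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL, status TEXT);
    INSERT INTO orders VALUES
        ('EMEA', 120.0, 'completed'),
        ('EMEA',  80.0, 'cancelled'),
        ('APAC', 200.0, 'completed');
""")
# Revenue and order count per region, completed orders only
rows = conn.execute("""
    SELECT region,
           COUNT(*)    AS n_orders,
           SUM(amount) AS revenue
    FROM orders
    WHERE status = 'completed'
    GROUP BY region
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # [('APAC', 1, 200.0), ('EMEA', 1, 120.0)]
```

The same shape of query, pointed at production tables, is what typically feeds a Looker Studio data source.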
Register now and look forward to many interesting and suitable positions and projects.

Responsibilities:
- Technical and administrative support of the global PLM system landscape with a focus on Siemens Teamcenter and SAP S/4HANA
- Further development and international rollout of CAD/PLM solutions (NX, Teamcenter) as part of the global digitalization strategy
- Execution of data migrations from legacy systems into new system environments, as well as support during product phase-outs
- Ensuring smooth system integration between Teamcenter, SAP S/4HANA, and connected CAD systems
- Professional consulting and technical support for the specialist departments
- Interaction with internal and external partners for the operational implementation of global rollout projects
- Creation of technical documentation, execution of tests, and training of end users
- Clear hands-on role, with no architectural or strategic steering responsibilities

Requirements:
- Extensive experience in the administration of Siemens Teamcenter and SAP S/4HANA (hands-on focus, not an architectural or steering profile)
- Strong proficiency in working with CAD systems, particularly Siemens NX
- Experience with global PLM rollouts, system migrations, and the decommissioning of legacy systems
- Solid understanding of complex integration scenarios between PLM, CAD, and ERP systems
- Excellent German and English skills, both written and spoken
- Willingness to travel: assignments in Hungary (3–4 weeks) and the UK (2–3 weeks)
- Preferred: experience in an industrial environment (manufacturing / mechanical engineering)

Benefits:
- Partnership: comprehensive 360-degree support throughout the entire cooperation

Your contact: Patrick Thomas Marolt | Reference number: 864753/1 | Email: patrick.marolt@hays.at | Employment type: Freelance, project-based
The Oberalp Group is a management-driven family business, a house of brands that creates high-quality technical mountaineering products. We have six brands of our own: Salewa, Dynafit, Pomoca, Wild Country, Evolv, and LaMunt. As an exclusive partner of other brands in the sports sector, we offer our entire know-how in communication, sales, and image building.
We look forward to hearing from you.

Responsibilities:
- Manage and refine business and technical requirements in collaboration with stakeholders
- Coordinate data integration activities with various source systems
- Design and model data structures within a Data Warehouse environment, with a strong focus on Data Vault methodology
- Develop and optimize data pipelines using SQL and Python
- Work with tools like Databricks and dbt to build scalable data transformation workflows
- Ensure data quality, consistency, and compliance, especially within banking-related use cases

Requirements:
- Experience in requirements management
- Experience in coordination with source systems
- Experience with data modeling in a Data Warehouse environment, with a focus on Data Vault
- Good German and English language skills
- Databricks experience is nice to have
- Experience with dbt (data build tool) is an advantage
- Experience with SQL (as a query language) and Python is an advantage
- Banking experience is an advantage

Benefits:
- Renowned client
- Remote work

Your contact: Florian Pracher | Reference number: 863466/1 | Email: florian.pracher@hays.at | Employment type: Freelance, project-based
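As a hedged illustration of the Data Vault methodology named in this posting: Data Vault models typically join hubs, links, and satellites on hash keys derived from business keys. Below is a minimal Python sketch of one common hashing convention; the normalization rules and the MD5 choice are an assumption for illustration, not this project's actual standard:

```python
import hashlib

def hub_hash_key(*business_keys: str) -> str:
    """Data-Vault-style hash key: MD5 over the normalized, delimited
    concatenation of the business key parts. Upper-casing, trimming,
    and the '||' delimiter are one common convention, not a fixed rule."""
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Identical business keys always yield the same surrogate key, which is
# what lets hubs, links, and satellites join deterministically.
print(hub_hash_key("  cust-42 ") == hub_hash_key("CUST-42"))  # True
```

In dbt projects this logic usually lives in a macro so every model computes keys identically.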
Responsibilities:
- Design, build, and optimize batch data pipelines for internal tool use cases
- Develop efficient Spark SQL transformations for large-scale datasets
- Use Python for data processing, orchestration, and automation
- Create and maintain data models (facts, dimensions, aggregates) with clear grain and metric definitions
- Ensure data quality and correctness, including handling late data, duplicates, and adjustments
- Implement validation, data quality checks, and reconciliation logic
- Work with business stakeholders to gather requirements, define metrics, and translate needs into pipelines
- Collaborate with infrastructure teams on standards, performance tuning, and best practices

Requirements:
- Bachelor's or Master's degree in a technical field, or an equivalent qualification
- Experience in data engineering or a related field
- Strong proficiency in Spark SQL for large-scale data transformations
- Solid Python skills for data processing and pipeline development
- Strong understanding of data modeling (fact tables, dimensions, grain, SCDs)
- Hands-on experience building and maintaining batch pipelines in production
- High attention to detail with a strong focus on data quality and metric integrity
- Ability to communicate clearly with non-technical stakeholders and translate business needs into data solutions

Benefits:
- Remuneration under the most attractive collective agreement in the industry
- Annual leave entitlement of 30 days
- Generous working-time account with the option to have overtime paid out
- Subsidized direct insurance (as a company pension scheme)

Your contact: Kristina Meng | Reference number: 863942/1 | Email: kristina.meng@hays.de | Employment type: Employment with Hays Professional Solutions GmbH
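The "handling late data and duplicates" responsibility above usually boils down to keep-latest-per-key deduplication. Here is a small, self-contained Python sketch of that logic; the record shapes and field names are invented, and in Spark SQL the same step is typically a ROW_NUMBER window partitioned by the business key:

```python
# Hypothetical sketch: for each business key, keep only the record with
# the latest load timestamp, so late-arriving corrections win over the
# original rows. Field names and data are invented for illustration.
records = [
    {"order_id": 1, "amount": 10.0, "loaded_at": "2024-01-01"},
    {"order_id": 1, "amount": 12.5, "loaded_at": "2024-01-03"},  # late correction
    {"order_id": 2, "amount": 99.0, "loaded_at": "2024-01-02"},
]

def dedupe_latest(rows, key, ts):
    """Keep the row with the greatest `ts` for each distinct `key`."""
    latest = {}
    for r in rows:
        k = r[key]
        if k not in latest or r[ts] > latest[k][ts]:
            latest[k] = r
    return sorted(latest.values(), key=lambda r: r[key])

clean = dedupe_latest(records, key="order_id", ts="loaded_at")
print([r["amount"] for r in clean])  # [12.5, 99.0]
```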
We support you on your journey: individual learning opportunities, worldwide job opportunities, and technical training from our academy. The safety and well-being of our employees are important to us, which is why we set high standards for workplace safety.
Responsibilities:
- Maintain and troubleshoot data integration pipelines to ensure stable data flow into AI and analytics systems
- Support model development by assisting with training, validation, and optimization of machine learning workflows
- Conduct data analysis to extract insights and provide clear reports supporting R&D research questions
- Solve technical challenges related to data access, pipeline performance, and software limitations
- Ensure continuity of ongoing projects by aligning closely with the core team and delivering on timelines
- Perform image analysis and prepare datasets required for scientific and ML use cases
- Manage and improve ETL processes to ensure data quality, structure, and availability
- Document workflows, pipeline changes, and analytical steps to ensure clarity and reproducibility

Requirements:
- Academic background in computer science, data science, engineering, or a related quantitative field
- Strong proficiency in Python with expertise in scientific and analytical libraries
- Skilled in SQL and working with relational databases
- Understanding of ETL concepts and practical experience working with data pipelines
- Solid foundation in machine learning principles and the model lifecycle
- Ability to perform image analysis for scientific or research applications
- Strong communication and interpersonal skills with the ability to collaborate in a technical team
- Independent, structured problem-solver with a commitment to clear documentation and FAIR data practices

Benefits:
- Opportunity to contribute directly to active R&D projects with immediate real-world impact
- Hands-on involvement in AI, machine learning, and data integration challenges in a scientific environment
- Close collaboration with a small, highly skilled technical team

Your contact: Reference number: 863771/1 | Phone: +41 44 225 50 00 | Email: positionen@hays.ch | Employment type: Freelance, project-based
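To illustrate the data-quality side of the ETL responsibilities above, here is a hypothetical validation-gate sketch in Python: rows are checked before entering the pipeline and failures are reported per record. The field names and the range rule are invented for illustration:

```python
# Hedged sketch of a pre-ingestion data-quality gate: validate each row
# and collect human-readable error messages. Fields and rules are invented.
REQUIRED = ("sample_id", "value")

def validate(row: dict) -> list:
    errors = [f"missing {f}" for f in REQUIRED if row.get(f) is None]
    v = row.get("value")
    if isinstance(v, (int, float)) and not (0.0 <= v <= 1.0):
        errors.append("value out of range [0, 1]")
    return errors

rows = [
    {"sample_id": "s1", "value": 0.42},   # clean
    {"sample_id": "s2", "value": 3.14},   # out of range
    {"sample_id": None, "value": 0.10},   # missing key
]
report = {r["sample_id"]: validate(r) for r in rows}
print(report)
```

Logging such a report per batch is one simple way to make the reproducibility requirement above concrete.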
Senior Data Scientist
Data Science & Advanced Analytics
Frankfurt, Germany

The Senior Data Scientist is responsible for designing, deploying, and continuously optimizing scalable ML solutions that translate business requirements into measurable impact across EMEA markets. The position combines technical execution with business alignment, ensuring solutions remain adaptable to evolving commercial needs while maintaining architecture consistency and standardization.
HPA – Hamburg Port Authority AöR is seeking a Data Governance Manager (m/f/d) in the Technical Division (ID number: 13745001)
What you can expect
- You take on functional and disciplinary responsibility for all FTEs in the Sales Data Hub within the federated data setup, including Data Engineers, Data Scientists, Data Governance roles, and Product Owners Data.
- You define, design, develop, and operate cloud-based data products for the Sales, Marketing, and Customer Service business units.
- You are responsible for the methodological integration of data assets and data products.
- You manage the Sales Data Hub operationally and further develop it as a specialized unit for data-driven solutions in Sales, Marketing, and Customer Service.
- You are responsible for the further development and ongoing maintenance of all sales-related models based on feedback and requirements from the sales organization.
- You lead projects related to planning, expanding, and organizing new and existing products in collaboration with the relevant business units and external partners.
- You assume technical responsibility for data products developed by or for the Sales, Marketing, and Customer Service areas within the Data Intelligence & Analytics team.
- You drive the continuous expansion, professionalization, and organizational development of the Sales Data Hub within the existing governance and organizational framework.
WHAT YOU WILL DO
- Provide consultation to ITS, the business, and third parties on major and highly complex opportunities
- Identify opportunities to apply your technical specialism more effectively within the employing and closely associated organisations
- Stay up to date with current developments and carry out specific assignments related to business specialisms
- Monitor and oversee the overall performance of services that fall within your responsibility
- Drive major change initiatives or business opportunities related to your technical specialism
- Assist with quality assurance activities and educate less experienced colleagues in related areas
Responsibilities:
- Responsible for all aspects of the Material Master for the global template and for MDG-M
- Review all requirements for clarity and for support of a harmonized approach
- Drive, oversee, guide, and assist the technical developers
- Test all requirement implementations
- Involved in all aspects of PLM interfaces for each of the PLM systems
- With each rollout, the above topics must be addressed in different ways, including migration
- With regard to migration: manage clones, collisions, and duplicates for every S4E-Unify and Agora rollout
- Provide demonstrations of MDG-M functionality; oversee the Data Quality module and its implementation relative to KPIs and the management of KPI results

Requirements:
- Bachelor's degree in Computer Science, Business Management, Management Information Systems, or equivalent experience
- Experience in business process design in material management and production planning
- Expert knowledge of PP, MM, MDG-M, processes, and customizing in SAP S/4HANA
- Experience with and the ability to read and debug ABAP code are desirable
- Ability to meet business requirements through the standard solution used
- Excellent troubleshooting and analytical skills
- 10–20% business travel (international and domestic) is expected
- Languages: English; German (optional)

Benefits:
- Exciting global projects: be part of a major international SAP S/4HANA transformation program
- Hybrid work model: flexible working hours and remote work options depending on project needs
- Professional development: opportunity to work with cutting-edge technologies and gain experience in SAP S/4HANA, MDG-M, and PLM interfaces
- International environment: collaborate with global teams across different countries and cultures
- Modern workplace: access to Siemens Energy's innovative and sustainable office infrastructure
- Networking opportunities: work alongside experts in energy, digitalization, and transformation

Salary information: 120,000
Your contact: Julian Hientz | Reference number: 847518/1 | Email: julian.hientz@hays.de | Employment type: Employment with Hays Professional Solutions GmbH
- You are responsible for the conceptual, logical, and structural integrity of our Core Data Model as well as the Gold Layer across Azure, Snowflake, and dbt.
- You ensure that fragmented data sources are transformed into consistent, reusable, and decision-relevant data products, actively preventing the platform from drifting into team-specific, incompatible models.
- You define and maintain central business objects, canonical dimensions, shared metrics, and facts, ensuring that the Core Data Model serves as a stable, business-oriented foundation across all domains.
- You develop modeling standards, naming conventions, layering concepts (Staging → Intermediate → Gold), reuse patterns, and dbt design guidelines, and you ensure their consistent implementation across all teams.
- You safeguard the semantic consistency of the entire data model, resolve domain conflicts, ensure that identical business terms are modeled only once, and review changes affecting core layers.
- You act as the technical design authority for model changes in Snowflake/dbt, balancing local requirements with long-term model coherence, and ensuring that all models remain performant, scalable, maintainable, and of high quality.
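Modeling standards like the layering and naming conventions described above are typically enforceable mechanically in CI. A minimal sketch, assuming common dbt-style prefixes (stg_/int_/dim_/fct_/mart_) rather than this team's actual guidelines:

```python
# Hypothetical convention check: every model name must carry a prefix
# matching its layer, so Staging -> Intermediate -> Gold is visible in
# the name itself. The prefixes are a common dbt convention, assumed here.
LAYER_PREFIXES = {
    "staging": ("stg_",),
    "intermediate": ("int_",),
    "gold": ("dim_", "fct_", "mart_"),
}

def violates_convention(model_name: str, layer: str) -> bool:
    """True if the model name lacks an allowed prefix for its layer."""
    return not model_name.startswith(LAYER_PREFIXES[layer])

print(violates_convention("stg_orders", "staging"))  # False
print(violates_convention("orders_gold", "gold"))    # True
```

Running such a check on every pull request is one lightweight way a design authority can keep teams from drifting into incompatible local models.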
What You'll Do:

Collaborate in an agile, international team
- Work closely with colleagues from Romania, Germany, and Ukraine
- Design, estimate, develop, and implement software solutions aligned with business needs
- Actively communicate progress, risks, and technical decisions to stakeholders

Build scalable data solutions
- Develop agnostic data products within a modern, cloud-native data ecosystem
- Support use cases across BI, Advanced Analytics, AI, and ML
- Translate business requirements into robust technical architectures
- Continuously enhance the performance, quality, and cost-efficiency of solutions
- Proactively suggest improvements and best practices

What makes you stand out
- Degree in Computer Science, Economics, or a comparable qualification
- Minimum 3 years of experience as a BI Engineer or Data Engineer, focused on cloud-based architectures
- Strong expertise in Snowflake and dbt (Data Build Tool)
- Solid knowledge of SQL and data lakehouse architectures; Python is nice to have

Communication is key
- Excellent communication skills in English (written and spoken) are mandatory
- Ability to clearly explain technical concepts to both technical and non-technical stakeholders
- Strong stakeholder management and collaboration skills
- Comfortable working in cross-border, multicultural teams

We look forward to your application and to applicants who enrich our diverse culture!
What You'll Do

Drive measurable business impact
- Independently lead AI and analytics initiatives that generate tangible business value
- Actively contribute to and influence the company's strategic direction

Apply advanced analytics & AI
- Develop and apply advanced statistical methods in an agile environment
- Work on business-critical questions using regression models, time series analysis, and machine learning & AI algorithms
- Collaborate cross-functionally or drive initiatives independently

Turn data into action
- Design and execute analyses on large datasets
- Translate findings into clear, actionable recommendations
- Work across the full spectrum, from Excel-based analysis to deep learning models

Build scalable AI solutions
- Develop and own customized Data Science and AI solutions
- Work with SQL on Azure and Snowflake platforms; Python is nice to have
- Lead projects end-to-end: from Proof of Concept (PoC) to fully operational production models
- Create audience-tailored presentations
- Translate complex analytical insights into clear business language
- Deliver compelling data storytelling to support decision-making at all levels

What makes you stand out
- Minimum 3 years of experience in Data Science, AI, Advanced Analytics, or similar roles
- Proven ability to generate measurable business value through data-driven solutions
- Strong expertise in statistical modeling and machine learning techniques
- Hands-on experience with Python, SQL, Azure, and Snowflake
- Experience building scalable models from PoC to production
- Strong communication and stakeholder management skills
- Ability to explain complex topics to both technical and non-technical audiences
- Bachelor's or Master's degree in Finance, Statistics, Computer Science, Mathematics, or a related quantitative field

We look forward to your application and to applicants who enrich our diverse culture!
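Of the method families listed above, a trailing moving average is the simplest time-series baseline against which more sophisticated forecasts are judged. A hedged sketch with invented sales figures:

```python
# Hypothetical sketch: a trailing moving average used as a naive
# next-step forecast. The monthly sales figures are invented.
def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` points."""
    tail = series[-window:]
    return sum(tail) / len(tail)

monthly_sales = [100.0, 110.0, 120.0, 130.0]
print(moving_average_forecast(monthly_sales, window=3))  # 120.0
```

A trending series like this one shows the baseline's weakness (it lags the trend), which is exactly the gap regression and ML models are brought in to close.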
The city administration of Jena (Stadtverwaltung Jena) is seeking a Technical Product Owner for the Microsoft 365 environment (m/f/d) (ID number: 13667712)
What makes you stand out
- You hold a degree in Business Informatics, Data Science, Computer Science, Industrial Engineering, Business Administration, or a comparable field.
- You bring experience in Revenue Operations, Sales Operations, Sales Analytics, or a similar commercial analytics role.
- You have a strong understanding of sales processes, pipeline management, forecasting, and revenue metrics, and you can translate them into technical requirements for engineers.
- You are a great stakeholder manager who can talk to sales and engineering alike.
- You are highly proficient in Power BI and experienced in building dashboards that drive action.
- You are comfortable working with CRM data and sales systems (e.g.
Collaborate with data scientists and engineers to break down abstract business challenges into a defined product vision, MVPs, and iterative delivery plans. Act as a bridge between technical and business teams, simplifying complex technical concepts and data insights, making them understandable for stakeholders. You’ll also translate business needs into technical requirements.
Key Responsibilities
- Develop and maintain detailed project budgets, cost estimates, and financial forecasts for data center construction and infrastructure projects, tracking expenditures against approved budgets
- Prepare comprehensive cost reports, cash flow projections, and value engineering analyses to optimize project costs while maintaining quality standards and technical requirements
- Coordinate with project managers, contractors, and procurement teams to evaluate contract proposals, change orders, and variation requests while ensuring cost-effective project delivery
- Conduct risk assessments for cost implications, develop contingency strategies, and monitor market conditions affecting material and labor costs in data center construction
- Review and validate contractor invoices, progress payments, and final accounts while maintaining detailed cost documentation and audit trails for financial compliance
- Support procurement processes, tender evaluations, and contract negotiations to achieve optimal value while facilitating cost reconciliation, lessons learned, and knowledge transfer for future projects

Qualifications & Skills
- Bachelor's degree in Quantity Surveying, Construction Management, Engineering, or a related field, with several years of experience in cost management for data center or mission-critical facility projects
- Comprehensive knowledge of data center construction costs and industry pricing trends, and the ability to interpret technical specifications for accurate cost estimation and budget development
- Strong financial analysis expertise with proficiency in cost management software and spreadsheet applications, and a demonstrated ability to prepare detailed cost reports and forecasts
- Excellent analytical, communication, and stakeholder management skills, with a proven ability to negotiate with contractors and suppliers while managing cost-related project risks
- Experience with value engineering, life-cycle costing, and cost optimization techniques in complex construction environments, with an understanding of procurement processes and contract administration

Jones Lang LaSalle SE, Human Resources. Your contact: Jan Bauermann, Talent Acquisition Partner EMEA, jan.bauermann@jll.com. Location: On-site – Frankfurt am Main, DEU. If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements.
Responsibilities:
- Design and implement a SQL-based landing zone for regulatory data
- Develop stored procedures for transformation, enrichment, and aggregation
- Build and operate high-volume batch processing chains for monthly/quarterly cycles
- Implement SSIS-based ingestion flows and job orchestration
- Ensure data quality, technical lineage, and full traceability across layers
- Define and document integration patterns and mapping logic between landing-zone datasets and Tagetik-based reporting templates
- Perform operational monitoring, troubleshooting, and performance optimization

Requirements:
- Strong expertise in Microsoft SQL Server and T-SQL
- Hands-on experience with stored-procedure-driven ETL and complex data models
- Solid SSIS skills for orchestration and control of processing chains
- Experience with batch processing, logging, restartability, and performance tuning
- Knowledge of data lineage, reconciliation, and regulatory processing needs
- Experience with reporting platforms such as Tagetik is a plus
- Familiarity with Oracle source systems is advantageous

Benefits:
- Renowned client
- Remote option

Your contact: Eliška Stejskalová | Reference number: 862801/1 | Email: eliska.stejskalova@hays.at | Employment type: Freelance, project-based
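The restartability requirement above is commonly met with a control table that records completed steps, so a rerun of a failed chain skips finished work instead of reprocessing it. A minimal sketch of the pattern, using sqlite3 as a stand-in for SQL Server and with invented step names:

```python
import sqlite3

# Hedged sketch of a restartable batch chain: a control table logs which
# steps completed, so a rerun is idempotent. In the actual stack this
# would be a SQL Server table driven from SSIS, not sqlite3.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE batch_log (step TEXT PRIMARY KEY, status TEXT)")

def run_step(step, work):
    """Run a step only if it has not already completed; log the outcome."""
    done = conn.execute(
        "SELECT 1 FROM batch_log WHERE step = ? AND status = 'ok'", (step,)
    ).fetchone()
    if done:
        return False  # already completed in a previous run
    work()
    conn.execute("INSERT OR REPLACE INTO batch_log VALUES (?, 'ok')", (step,))
    return True

executed = []
for attempt in range(2):  # simulate a crash-free run followed by a rerun
    for step in ("load_landing", "transform", "aggregate"):
        run_step(step, lambda s=step: executed.append(s))

print(executed)  # each step ran exactly once across both attempts
```

Writing a timestamp and row count into the log row instead of just 'ok' also gives the lineage and reconciliation evidence the posting asks for.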
Nordmark Pharma GmbH is seeking a working student (m/f/d) in Technical Support, part-time at 20 hours/week, fixed-term for a maximum of 7 months (ID number: 13747921)
- Provide guidance and support to architects and developers on documentation standards.
- Ensure consistency and clarity across technical documentation for integration and migration projects.

Your profile as a Documentation Specialist (m/f/d) at the Erlangen location:
- Strong experience in technical documentation within software development or system integration projects.
SUMMARY We are seeking an Application Support Specialist to join our team. You will provide consultation, guidance, and technical expertise with respect to leading-edge technologies. You will drive solutions for complex migration and integration opportunities and transition programs.
Design, develop, and deploy digital solutions, following the software development life cycle in an agile setup. Create technical documentation. Analyze and decompose business requirements into technical functionalities. Produce clean and efficient code based on business requirements and specifications.
Fresenius Medical Care Deutschland GmbH is seeking a (Senior) Data Privacy Engineer (m/f/d) / (Senior) Expert (m/f/d) – Technical Privacy (ID number: 13676634)
AUGENTIC GmbH is seeking a Technical Manager (m/f/d) (ID number: 13757550)
What you will do: Development and evaluation of statistical models and algorithms for complex marketing issues Independent analysis of complex data with the aim of identifying new insights and potential for performance optimization Identifying direct and indirect correlations between relevant key figures and deriving recommendations for action Linking and using the content of data from tracking systems and other reporting sources Support in the further development and testing of performance-relevant (attribution) models Initiation and further development of prediction and classification models using machine learning algorithms Who you are: You bring at least seven years of hands-on experience in Data Engineering, ideally in an agency, e-commerce, or performance-driven environment You have initial experience with machine learning algorithms and a solid understanding of common data analysis methods such as regression and clustering; knowledge of marketing attribution models is a strong plus You are proficient in SQL and either Python or R (both are a bonus) Experience with Dagster or comparable data orchestration tools is highly appreciated You are naturally curious, enjoy exploring new topics, statistical methods, and emerging technologies, and stay up to date with current technical developments Benefits Hybrid working Fresh fruit daily Sports courses Free entry to code.talks Exclusive employee discounts Free drinks Language courses Free Laracasts account Company events Relocation support Mobility allowance State-of-the-art technologies Central location Flexible working hours Company pension plan Training opportunities Dogs allowed AY Academy Feedback culture Company bicycle YOU ARE THE CORE OF ABOUT YOU.
What you will do: Development and evaluation of statistical models and algorithms for complex marketing issues Independent analysis of complex data with the aim of identifying new insights and potential for performance optimization Identifying direct and indirect correlations between relevant key figures and deriving recommendations for action Linking and using the content of data from tracking systems and other reporting sources Support in the further development and testing of performance-relevant (attribution) models Initiation and further development of prediction and classification models using machine learning algorithms Who you are: You bring at least two years of hands-on experience in Data Engineering, ideally in an agency, e-commerce, or performance-driven environment You have initial experience with machine learning algorithms and a solid understanding of common data analysis methods such as regression and clustering; knowledge of marketing attribution models is a strong plus You are proficient in SQL and either Python or R (both are a bonus) Experience with Dagster or comparable data orchestration tools is highly appreciated You are naturally curious, enjoy exploring new topics, statistical methods, and emerging technologies, and stay up to date with current technical developments Additional information: Working model: Due to the upcoming tasks and responsibilities for this position, it is required to work onsite at our headquarters in Hamburg or Berlin on a weekly basis.
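The marketing attribution models mentioned above come in several standard flavors. As a rough illustration only (the function names and sample data are mine, not from the posting), two of the simplest rules sketched in Python:

```python
def linear_attribution(touchpoints, conversion_value):
    """Split a conversion's value evenly across every touch on the path,
    accumulating credit per channel."""
    share = conversion_value / len(touchpoints)
    credit = {}
    for channel in touchpoints:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

def last_touch_attribution(touchpoints, conversion_value):
    """Give the full conversion value to the final channel on the path."""
    return {touchpoints[-1]: conversion_value}

# Usage: one customer journey worth 100.0 in conversion value.
path = ["search", "display", "email", "search"]
linear = linear_attribution(path, 100.0)     # "search" appears twice, so it gets two shares
last = last_touch_attribution(path, 100.0)
```

Production attribution models (position-based, time-decay, data-driven) refine the same idea: they change only how the shares are weighted across the path.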
#shapeandcreate Your responsibilities: Conception, design, and implementation of technical solutions in the SAP S/4HANA Finance environment. Customization and further development of the SAP S/4HANA Finance system. Development of ABAP/ABAP OO programs for the automation and optimization of business processes. Analysis of business requirements and creation of technical specifications. Close cooperation with specialist departments and other IT teams. Testing and documentation of the developed solutions. Error analysis and troubleshooting in the SAP S/4HANA Finance system.
elasticsearch AWS Python Google BigQuery Google Cloud Platform Numpy Pandas Gitlab What you will do Design and develop innovative algorithms to power a personalized shopping experience, leveraging cutting-edge machine learning techniques Deploy your solutions into production, taking full ownership and ensuring high performance and scalability Combine your data science expertise with a pragmatic, agile approach to find innovative solutions and drive measurable results within a fast-paced environment Challenge the status quo by identifying areas for improvement in existing retrieval and reranking systems, particularly those relying heavily on business logic, and propose data-driven solutions Thrive in a dynamic, fast-paced environment with a flat hierarchy, where your ideas and contributions can make a real difference Who you are Proficiency in Python or experience with at least one scientific computing language (e.g., MATLAB, R, Julia, C++) Strong SQL skills with experience in analytical or transactional database environments Theoretical understanding of machine learning principles, coupled with a hands-on approach to building and iterating on models Proven experience in building and deploying machine learning solutions that deliver tangible business value Strong understanding of data structures, algorithms, and tools for efficiently handling large datasets (e.g. 
pandas, numpy, dask, arrow, polars, …) Experience designing, building, and managing data pipelines Familiarity with cloud-based model training and serving platforms (e.g., GCP Vertex AI, Amazon SageMaker) Solid understanding of statistical methods for model evaluation Big Data: Experience analyzing large datasets using statistical and machine learning techniques DevOps: Familiarity with CI/CD tools (e.g., GitLab CI/CD, HashiCorp Terraform) is a plus Generative AI: Experience with generative AI and agentic frameworks (e.g., LangChain, ADK, CrewAI, Pydantic AI, …) is a plus Understanding of recommendation, retrieval and reranking systems in e-commerce and retail is a plus Excellent written and verbal communication skills in English Ability to effectively communicate complex machine learning concepts to both technical and non-technical stakeholders Proven ability to collaborate effectively within a team to establish standards and best practices for deploying machine learning models A proactive approach to knowledge sharing and fostering a quick development environment Nice to have Experience with BigQuery Knowledge of time series and (graph) neural network models Familiarity with statistical testing and Gaussian Processes Strong knowledge of computer vision libraries (e.g.
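The retrieval-and-reranking systems this posting alludes to typically blend a learned relevance score with explicit business rules. A deliberately simplified Python sketch, where the function name, the blend weight `alpha`, and the sample scores are all illustrative assumptions rather than anything from the posting:

```python
def rerank(candidates, model_score, business_boost, alpha=0.8):
    """Order candidates by a weighted blend of a model relevance score
    and a hand-written business-logic boost; alpha controls the blend."""
    def blended(item):
        return alpha * model_score[item] + (1 - alpha) * business_boost.get(item, 0.0)
    return sorted(candidates, key=blended, reverse=True)

# Usage: item "a" is boosted (e.g. sponsored or in stock) but still ranks
# below items the model scores much higher.
items = ["a", "b", "c"]
scores = {"a": 0.2, "b": 0.9, "c": 0.5}
boosts = {"a": 1.0}
ranking = rerank(items, scores, boosts, alpha=0.8)
```

Replacing such hard-coded boosts with learned, data-driven signals is exactly the kind of "challenge the status quo" improvement the responsibilities above describe.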
Responsibilities Lead end-to-end implementation and migration projects for enterprise clients Define and document target architectures, integrating SCAYLE with client systems (ERP, PIM, CRM) Provide guidance on SCAYLE platform capabilities, APIs, and composable commerce Translate business requirements into technical specifications and actionable work packages Collaborate with external partners to ensure delivery quality and standards compliance Review and validate integrations: REST APIs, webhooks, and asynchronous data flows Support cross-functional teams in resolving technical blockers Requirements Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field 5+ years in enterprise e-commerce architecture, digital consulting, or technical lead roles Proven experience leading large-scale SaaS or MACH replatforming projects Deep knowledge of e-commerce architectures: Storefronts, OMS, PIM, CMS, ERP integrations Expert in MACH principles (Microservices, API-first, Cloud-native, Headless) Hands-on experience with REST APIs, webhooks, and event-driven architectures Strong problem-solving, consultative mindset, and clear communication skills YOU ARE THE CORE OF OUR COMPANY We take responsibility for creating an inclusive and exceptional environment where all genders, nationalities and ethnicities feel welcomed and accepted exactly as they are.
YOUR TASKS: Design, develop, and deploy digital solutions, following the software development life cycle in an agile setup Develop solutions on a leading-edge cloud-based platform for managing and analyzing large datasets Create technical documentation Analyze and decompose business requirements into technical functionalities Produce clean and efficient code based on business requirements and specifications Create Notebooks, pipelines and workflows in Scala or Python to ingest, process and serve data in our platform Be a technical lead for junior and external developers Be a part of the continuous improvement of Nordex’ development processes by participating in retrospectives and proposing optimizations YOUR PROFILE: Technical degree in Computer Science, Software Engineering or comparable Experience or certification in Databricks Fluent English At least 3 years of proven experience Availability to travel YOUR BENEFITS: In addition to the opportunity to make our world a little more sustainable, we offer you: *Some offers may vary by location. ** Hybrid working in accordance with the company's internal policy.
My employer: A successful, growing, mid-sized provider of relevant and innovative data- and AI-driven solutions that optimize decision-making for well-known companies, this employer offers an open, agile, progress-oriented culture in which teamwork, personal responsibility, and continuous learning are highly valued. Creative latitude in demanding projects, long-term prospects, and flexibility are included. Conception, implementation, and further development of generative-AI and NLP systems that optimally meet customer requirements with regard to performance, latency, cost, and extensibility Regular coordination with stakeholders regarding requirements Close collaboration with developers and technical leads on implementation and on use cases such as retrieval-based chatbots, agent systems, or fine-tuning of language models Planning and implementation of robust machine learning pipelines according to best practices, on Azure, AWS, or GCP Input on complex technical challenges and presentation of solution approaches Active tracking of new developments in NLP and AI in order to always offer customers modern, high-quality solutions Successfully completed degree in (business) informatics or a comparable qualification Relevant professional experience in data science, ideally with several of the typical NLP/LLM tools such as OpenAI APIs, Bedrock, Azure AI Foundry, LangChain, LangGraph, Instructor, Hugging Face, tokenizers, vector databases, performant inference, model deployment, MCP/A2A, and dataset creation Very good knowledge of machine learning and deep learning, particularly with regard to transformer models, LLMs, and generative AI Confident use of production-ready frameworks such as PyTorch as well as agent frameworks such as LangGraph, SmolAgents, OpenAI Agent SDK, CrewAI, or PydanticAI Deep understanding of model optimization with PEFT such as QLoRA, instruction fine-tuning, post-training, inference optimization, and embeddings Confident in workflows such as conversational AI, RAG, information extraction, tool calling, and LLM evaluation Knowledge of agentic RAG, GraphRAG, multi-agent systems, text-to-SQL, and code retrieval Very good understanding of deployments and MLOps on Azure, GCP, or AWS High standards for software quality and the ability to write clean, performant, scalable code and put AI systems into production The ability to translate complex requirements into technical solutions and to communicate them confidently to non-experts, plus good German and English skills Independent work and the opportunity to shape things thanks to short decision paths Professional growth through a focus on innovation: the varied and diverse projects revolve around the development of intelligent algorithms, data-based strategies, and tailored AI solutions A committed, dynamic, constructive team with strong cohesion and an open feedback culture Well-connected, modern premises and high-quality technical equipment A flexibly plannable remote share of up to 40%, at times even from elsewhere in the EU Subsidy for the Deutschland-Ticket, for sports and wellness offers, and for childcare Salary information: Depending on experience, up to €100,000 p.a.
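The retrieval step behind retrieval-based chatbots and RAG reduces, at its core, to nearest-neighbor search over embedding vectors. A toy Python sketch with made-up two-dimensional "embeddings" (a real system would use model-produced embeddings and a vector database, not hand-written lists):

```python
import math

def cosine(u, v):
    """Cosine similarity between two non-zero vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def retrieve(query_vec, doc_vecs, k=2):
    """Return the ids of the k documents most similar to the query,
    as a vector-database lookup would in a RAG pipeline."""
    ranked = sorted(doc_vecs, key=lambda d: cosine(query_vec, doc_vecs[d]), reverse=True)
    return ranked[:k]

# Usage: a query vector close to the "faq" and "manual" embeddings.
docs = {"faq": [1.0, 0.0], "manual": [0.9, 0.1], "blog": [0.0, 1.0]}
top = retrieve([1.0, 0.05], docs, k=2)
```

The retrieved documents would then be injected into the language-model prompt; everything else in a RAG pipeline (chunking, re-ranking, evaluation) builds on this lookup.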
SMS group GmbH is seeking a Technical System Manager - Software Process Automation (f/m/d) (ID number: 13595828)
Vossloh AG is seeking a Technical Project Manager SAP S/4HANA Transformation (m/f/d) (ID number: 13710565)
Impact & Team Power: High degree of creative freedom and unbeatable team power in a high-performance work environment Salary & Employee Discounts: Secure job with the pioneer of discounts offering an attractive salary and corporate benefits Your tasks Lead the SAP development within the S/4HANA transformation program, functional developments and operational business Ensure all custom developments comply with SAP Clean Core principles and standardized development guidelines Design and implement scalable, maintainable custom solutions using ABAP, CDS Views, OData services, and Fiori/UI5 Collaborate with solution architects and challenge technical design proposals Oversee integration scenarios leveraging SAP BTP services (CAP, Integration Suite, Workflow Management) Establish and enforce development best practices, code reviews, and quality assurance processes Support migration activities and ensure compatibility with future SAP upgrades Act as a technical mentor for development teams and coordinate offshore/nearshore resources Your profile Degree in Business Informatics or a comparable qualification Several years of experience in SAP development (ABAP, CDS, Fiori/UI5) Experience in SAP S/4HANA transformation projects and custom development governance Familiarity with SAP BTP services and cloud integration patterns as well as experience with SAP CAP (Cloud Application Programming Model) Ability to critically assess architecture proposals and provide sustainable alternatives Knowledge of API management and event-driven architectures Familiarity with DevOps tools and CI/CD pipelines for SAP environments Strong knowledge of performance optimization and secure coding practices Very good German & English language skills, both spoken and written SAP-Developer-Core-Platform-m-f-d-Essen
LBBW Landesbank Baden-Württemberg is seeking a Technical Expert Ab Initio (m/f/d) (ID number: 13718984)
Working in an interdisciplinary team of engineers to develop and improve designs and manufacturing processes for thick film sensors Improve and maintain the data infrastructure and pipeline for production and process control data from various sources and ensure timely data availability Act as a technical interface between R&D and Production and between various R&D departments to harmonize data handling and standards Improve and maintain data visualization tools (dashboards, interactive charts) and support in routine data analysis Support in defining and improving image analysis methods and tools to derive quantitative feature values from images Extend the data infrastructure with additional information, e.g. from sensor performance characterization Data driven improvements of manufacturing processes Completed technical training in process engineering, data science, bioinformatics, or similar professional education Professional experience in industrial R&D or manufacturing environment, ideally in the medical device industry or a comparable regulated environment Experience in building and maintaining data pipelines (ETL processes) from diverse sources such as SQL databases, CSV, and machine log files Ability to create interactive dashboards and visualization tools with a solid understanding of applied statistics (e.g. 
correlation analysis, cluster analysis) to support the development teams Skills in digital image processing, object-oriented programming (OOP) in Python, and knowledge of SQL are a strong advantage, adding significant value to this opportunity Good communication skills in a multicultural and multidisciplinary environment A thorough way of working and documentation Motivated team player with a passion for promoting and driving fast-paced and ambitious projects Aptitude to understand and improve the underlying technical processes Proficiency in both English and German Unlimited project contract Fascinating, innovative environment in an international atmosphere Your contact: Jannik Fabio Eichin Reference number: 865639/1 Get in touch by e-mail: jannik.eichin@hays.ch Employment type: Freelance, project-based
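The correlation analysis mentioned above is typically a first-pass check of how process parameters relate to outcomes. A minimal Python sketch with invented sample data (real process-control data would come from the SQL databases and machine log files the posting lists):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series,
    as used in routine correlation analysis of process-control data."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Usage: yield tracks temperature perfectly linearly, so r is 1.0.
temperature = [20.0, 21.0, 22.0, 23.0]
yield_pct = [90.0, 91.0, 92.0, 93.0]
r = pearson(temperature, yield_pct)
```

In practice such a coefficient is computed across many parameter pairs at once (e.g. with pandas' built-in correlation support) and only flags candidates for deeper process investigation, since correlation alone does not establish causation.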